Granular Elastic Network Regression with Stochastic Gradient Descent

Authors

Abstract

Linear regression is the use of linear functions to model the relationship between a dependent variable and one or more independent variables. Linear regression models have been widely used in various fields such as finance, industry, and medicine. To address the problem that traditional linear regression has difficulty handling uncertain data, we propose a granule-based elastic network regression model. First, we construct granules and granular vectors by granulation methods. Then, we define multiple granular operation rules so that the model can effectively handle uncertain data. Further, the granular norm and the norm of a granular vector are defined to design the granular loss function. After that, we derive the loss function and design a gradient descent optimization algorithm. Finally, we performed experiments on UCI datasets to verify the validity of the granular elasticity network. We found that it has the advantage of a good fit compared with traditional regression models.
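To make the training loop concrete, the sketch below implements plain elastic network (elastic net) regression fitted with stochastic gradient descent on ordinary numeric vectors. It is a minimal illustration only: the granules, granular operation rules, granular norms, and granular loss function defined in the paper are not reproduced here, and all function names and hyperparameters are illustrative assumptions.

```python
import numpy as np

def elastic_net_sgd(X, y, alpha=0.01, l1_ratio=0.5, lr=0.01, epochs=100, seed=0):
    """Elastic net linear regression fitted with plain SGD (illustrative sketch).

    Per-example loss:
        0.5 * (x.w + b - y)^2
        + alpha * (l1_ratio * ||w||_1 + 0.5 * (1 - l1_ratio) * ||w||_2^2)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):          # shuffle examples each epoch
            r = X[i] @ w + b - y[i]           # residual of the linear model
            # squared-error gradient plus a subgradient of the elastic net penalty
            g = r * X[i] + alpha * (l1_ratio * np.sign(w) + (1 - l1_ratio) * w)
            w -= lr * g
            b -= lr * r
    return w, b
```

The L1 term uses a subgradient (np.sign), which is the simplest way to fold the elastic net penalty into an SGD step; proximal or lazy updates are common refinements.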


Similar resources

Variations of Logistic Regression with Stochastic Gradient Descent

In this paper, we extend the traditional logistic regression model (LR) to the bounded logistic regression model (BLR) and compare them. We also derive the update rules of both models using stochastic gradient descent (SGD). The effects of choosing different learning rate schedules, stopping conditions, parameter initializations and learning algorithm settings are also discussed. We get the accuracy ...
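For reference, the standard (unbounded) logistic regression SGD update mentioned above can be sketched as follows; the bounded variant (BLR) and the schedule and stopping choices studied in that paper are not reproduced, and the names here are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_sgd(X, y, lr=0.1, epochs=50, seed=0):
    """Logistic regression via SGD for labels y in {0, 1} (illustrative sketch).

    Per-example gradient of the log loss: (sigmoid(x.w + b) - y) * x
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            err = sigmoid(X[i] @ w + b) - y[i]   # prediction error
            w -= lr * err * X[i]
            b -= lr * err
    return w, b
```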


Stochastic Gradient Descent with GPGPU

We show how to optimize a Support Vector Machine and a predictor for Collaborative Filtering with Stochastic Gradient Descent on the GPU, achieving 1.66- to 6-fold acceleration compared to a CPU-based implementation. The reference implementations are the Support Vector Machine by Bottou and the BRISMF predictor from the Netflix Prize winning team. Our main idea is to create a hash function of ...


Accelerating deep neural network training with inconsistent stochastic gradient descent

Stochastic Gradient Descent (SGD) updates a Convolutional Neural Network (CNN) with a noisy gradient computed from a random batch, and each batch evenly updates the network once in an epoch. This model applies the same training effort to each batch, but it overlooks the fact that the gradient variance, induced by Sampling Bias and Intrinsic Image Difference, renders different training dynamics on...


Recurrent neural network training with preconditioned stochastic gradient descent

Recurrent neural networks (RNN), especially the ones requiring extremely long-term memories, are difficult to train. Hence, they provide an ideal testbed for benchmarking the performance of optimization algorithms. This paper reports test results of a recently proposed preconditioned stochastic gradient descent (PSGD) algorithm on RNN training. We find that PSGD may outperform Hessian-free o...


Lazy Sparse Stochastic Gradient Descent for Regularized Multinomial Logistic Regression

Stochastic gradient descent efficiently estimates maximum likelihood logistic regression coefficients from sparse input data. Regularization with respect to a prior coefficient distribution destroys the sparsity of the gradient evaluated at a single example. Sparsity is restored by lazily shrinking a coefficient along the cumulative gradient of the prior just before the coefficient is needed. ...
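The lazy-shrinkage idea can be illustrated with a small sketch: rather than regularizing every coefficient at every step, each coefficient catches up on the shrinkage it skipped just before it is touched. For simplicity this sketch uses a binary model with an L2 (Gaussian-prior) penalty, not the multinomial setting of the paper, and all names are illustrative assumptions.

```python
import numpy as np

def lazy_l2_logistic_sgd(rows, y, d, lam=1e-4, lr=0.1, epochs=5):
    """Sparse SGD logistic regression with lazy L2 shrinkage (illustrative sketch).

    rows: list of sparse examples, each a list of (feature index, value) pairs.
    One L2 step multiplies a coefficient by (1 - lr*lam); a coefficient that
    was skipped for k steps catches up by multiplying by (1 - lr*lam)**k.
    """
    w = np.zeros(d)
    last = np.zeros(d, dtype=int)       # step at which each coeff was last shrunk
    decay, t = 1.0 - lr * lam, 0
    for _ in range(epochs):
        for feats, target in zip(rows, y):
            t += 1
            for j, _ in feats:          # catch up only the touched coefficients
                w[j] *= decay ** (t - last[j])
                last[j] = t
            z = sum(v * w[j] for j, v in feats)
            err = 1.0 / (1.0 + np.exp(-z)) - target
            for j, v in feats:          # usual sparse gradient step
                w[j] -= lr * err * v
    w *= decay ** (t - last)            # final catch-up for all coefficients
    return w
```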



Journal

Journal title: Mathematics

Year: 2022

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math10152628